
    Beyond Schumpeter vs. Arrow: How Antitrust Fosters Innovation

    The relationship between competition and innovation is the subject of a familiar controversy in economics, between the Schumpeterian view that monopolies favor innovation and the opposite view, often associated with Kenneth Arrow, that competition favors innovation. Taking their cue from this debate, some commentators reserve judgment as to whether antitrust enforcement is good for innovation. Such misgivings are unnecessary. The modern economic learning about the connection between competition and innovation helps clarify the types of firm conduct and industry settings where antitrust interventions are most likely to foster innovation. Measured against this standard, contemporary competition policy holds up well. Today's antitrust institutions support innovation by targeting the types of industries and practices where antitrust enforcement would enhance research and development incentives the most. It is time to move beyond the "on-the-one-hand Schumpeter, on-the-other-hand Arrow" debate and embrace antitrust as essential for fostering innovation.
    Technology and Industry

    The Case for Antitrust Enforcement

    No abstract available.
    Technology and Industry, Regulatory Reform, Other Topics

    Evolution of the cluster abundance in non-Gaussian models

    We carry out N-body simulations of several non-Gaussian structure formation models, including Peebles' isocurvature cold dark matter model, cosmic string models, and a model with primordial voids. We compare the evolution of the cluster mass function in these simulations with that predicted by a modified version of the Press-Schechter formalism. We find that the Press-Schechter formula can accurately fit the cluster evolution over a wide range of redshifts for all of the models considered, with typical errors in the mass function of less than 25%, considerably smaller than the amount by which predictions for different models may differ. This work demonstrates that the Press-Schechter formalism can be used to place strong model-independent constraints on non-Gaussianity in the universe.
    Comment: 11 pages, 12 postscript figures
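
    For reference, the textbook Press-Schechter mass function for Gaussian initial conditions, which the paper's modified version generalizes to non-Gaussian models, is sketched below; the notation is our own and is not taken from the paper.

```latex
% Standard Press-Schechter mass function (Gaussian initial conditions);
% the paper modifies this formalism for non-Gaussian models.
\[
  \frac{dn}{dM}
  = \sqrt{\frac{2}{\pi}}\,\frac{\bar{\rho}}{M^{2}}\,
    \frac{\delta_c}{\sigma(M)}
    \left|\frac{d\ln\sigma}{d\ln M}\right|
    \exp\!\left(-\frac{\delta_c^{2}}{2\sigma^{2}(M)}\right)
\]
% \bar{\rho} : mean comoving matter density
% \sigma(M) : rms linear overdensity smoothed on mass scale M
% \delta_c \approx 1.686 : critical linear overdensity for collapse
```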

    Reinvigorating Horizontal Merger Enforcement

    The past forty years have witnessed a remarkable transformation in horizontal merger enforcement in the United States. With no change in the underlying statute, the Clayton Act, the weight given to market concentration by the federal courts and by the federal antitrust agencies has declined dramatically. Instead, increasing weight has been given to three arguments often made by merging firms in their defense: entry, expansion and efficiencies. We document this shift and provide examples where courts have approved highly concentrating mergers based on limited evidence of entry and expansion. Using merger enforcement data and a survey of merger practitioners that we conducted, we show that the decline in antitrust enforcement is ongoing, especially at the current Justice Department. We then argue in favor of reinvigorating horizontal merger enforcement by partially restoring the structural presumption and by requiring strong evidence to overcome the government's prima facie case. We propose several routes by which the government can establish its prima facie case, distinguishing between cases involving coordinated vs. unilateral anticompetitive effects.
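
    As background for the structural presumption, market concentration in U.S. merger review is conventionally measured with the Herfindahl-Hirschman Index (HHI). The Python sketch below is purely illustrative and is not drawn from the paper; the market shares and the hhi helper are hypothetical.

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent).

    Ranges from near 0 (atomistic market) to 10,000 (pure monopoly).
    """
    return sum(s ** 2 for s in shares)

# Hypothetical market: four firms with 35%, 30%, 20%, and 15% shares.
pre_merger = [35, 30, 20, 15]
# Suppose the two smallest firms merge.
post_merger = [35, 30, 35]

delta = hhi(post_merger) - hhi(pre_merger)
print(f"Pre-merger HHI:  {hhi(pre_merger)}")   # 2750
print(f"Post-merger HHI: {hhi(post_merger)}")  # 3350
print(f"Change (delta):  {delta}")             # 600

# Under the 2010 U.S. Horizontal Merger Guidelines, a post-merger HHI
# above 2,500 combined with an increase above 200 is presumed likely
# to enhance market power.
```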

    An automatic adaptive method to combine summary statistics in approximate Bayesian computation

    To infer the parameters of mechanistic models with intractable likelihoods, techniques such as approximate Bayesian computation (ABC) are increasingly being adopted. One of the main disadvantages of ABC in practical situations, however, is that parameter inference must generally rely on summary statistics of the data. This is particularly the case for problems involving high-dimensional data, such as biological imaging experiments. However, some summary statistics contain more information about parameters of interest than others, and it is not always clear how to weight their contributions within the ABC framework. We address this problem by developing an automatic, adaptive algorithm that chooses weights for each summary statistic. Our algorithm aims to maximize the distance between the prior and the approximate posterior by automatically adapting the weights within the ABC distance function. Computationally, we use a nearest neighbour estimator of the distance between distributions. We justify the algorithm theoretically based on properties of the nearest neighbour distance estimator. To demonstrate the effectiveness of our algorithm, we apply it to a variety of test problems, including several stochastic models of biochemical reaction networks, and a spatial model of diffusion, and compare our results with existing algorithms.
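
    A minimal sketch of the weighted ABC distance that such an algorithm tunes, assuming a toy Gaussian model with two summary statistics and fixed illustrative weights; the paper's actual contribution, adapting the weights automatically via a nearest-neighbour distance estimator, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=100):
    """Toy stochastic model: Gaussian data with unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summaries(x):
    """Two summary statistics of differing informativeness about theta."""
    return np.array([np.mean(x), np.var(x)])

def weighted_distance(s_obs, s_sim, w):
    """Weighted Euclidean distance between summary vectors.

    The paper's algorithm adapts w to maximize the distance between the
    prior and the approximate posterior; here w is simply held fixed.
    """
    return np.sqrt(np.sum(w * (s_obs - s_sim) ** 2))

# Observed data generated from a 'true' parameter value.
s_obs = summaries(simulate(theta=2.0))

# ABC rejection: draw from the prior, keep parameters whose simulated
# summaries fall within a tolerance eps of the observed summaries.
w = np.array([1.0, 0.1])   # illustrative fixed weights
eps = 0.2
accepted = []
for _ in range(20_000):
    theta = rng.uniform(-5, 5)          # prior draw
    s_sim = summaries(simulate(theta))
    if weighted_distance(s_obs, s_sim, w) < eps:
        accepted.append(theta)

print(f"Accepted {len(accepted)} draws; posterior mean ~ {np.mean(accepted):.2f}")
```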

    The impact of temporal sampling resolution on parameter inference for biological transport models

    Imaging data has become widely available to study biological systems at various scales, for example, the motile behaviour of bacteria or the transport of mRNA, and it has the potential to transform our understanding of key transport mechanisms. Often these imaging studies require us to compare biological species or mutants, and to do this we need to quantitatively characterise their behaviour. Mathematical models offer a quantitative description of a system that enables us to perform this comparison, but to relate these mechanistic mathematical models to imaging data, we need to estimate the parameters of the models. In this work, we study the impact of collecting data at different temporal resolutions on parameter inference for biological transport models by performing exact inference for simple velocity jump process models in a Bayesian framework. This issue is prominent in a host of studies because the majority of imaging technologies place constraints on the frequency with which images can be collected, and the discrete nature of observations can introduce errors into parameter estimates. In this work, we avoid such errors by formulating the velocity jump process model within a hidden states framework. This allows us to obtain estimates of the reorientation rate and noise amplitude for noisy observations of a simple velocity jump process. We demonstrate the sensitivity of these estimates to temporal variations in the sampling resolution and extent of measurement noise. We use our methodology to provide experimental guidelines for researchers aiming to characterise motile behaviour that can be described by a velocity jump process. In particular, we consider how experimental constraints resulting in a trade-off between temporal sampling resolution and observation noise may affect parameter estimates.
    Comment: Published in PLOS Computational Biology
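
    A minimal sketch of the kind of data-generating process studied: a one-dimensional velocity jump process (constant speed, reorientations at a Poisson rate) observed at a fixed sampling resolution with additive Gaussian measurement noise. The parameter values are hypothetical, and the hidden-states inference machinery from the paper is not reproduced; this only illustrates how sampling resolution and noise enter the observations.

```python
import numpy as np

rng = np.random.default_rng(1)

def velocity_jump_path(lam, speed, t_end, dt_fine=1e-3):
    """Simulate a 1D velocity jump process on a fine time grid.

    The particle moves at +/- speed and reverses direction at the events
    of a Poisson process with rate lam (the reorientation rate).
    """
    n = round(t_end / dt_fine)
    x = np.empty(n)
    pos, v = 0.0, speed
    for i in range(n):
        # A reorientation occurs in [t, t + dt) with probability ~ lam * dt.
        if rng.random() < lam * dt_fine:
            v = -v
        pos += v * dt_fine
        x[i] = pos
    return x

def observe(path, dt_fine, dt_sample, noise_sd):
    """Subsample the path at resolution dt_sample and add Gaussian noise."""
    step = round(dt_sample / dt_fine)
    obs = path[::step]
    return obs + rng.normal(0.0, noise_sd, size=obs.shape)

# Illustrative parameters: reorientation rate 1/s, speed 2 (e.g. um/s).
path = velocity_jump_path(lam=1.0, speed=2.0, t_end=10.0)

# Coarser sampling and larger noise both degrade parameter estimates.
coarse = observe(path, 1e-3, dt_sample=0.5, noise_sd=0.1)
fine = observe(path, 1e-3, dt_sample=0.05, noise_sd=0.1)
print(f"{len(fine)} fine vs {len(coarse)} coarse observations")
```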

    The Retirement Incentive Effects of Canada's Income Security Programs

    Like most other developed nations, Canada has a large income security system for retirement that provides significant and widely varying disincentives to work at older ages. Empirical investigation of their effects has been hindered by a lack of appropriate data. We provide an empirical analysis of the retirement incentives of the Canadian Income Security (IS) system using a new and comprehensive administrative data base. We find that the work disincentives inherent in the Canadian IS system have large and statistically significant impacts on retirement. This suggests that program reform can play some role in responses to the fiscal crises these programs periodically experience. We also demonstrate the importance of controlling for lifetime earnings in retirement models. Specifications without these controls overestimate the effects of the IS system. Finally, our estimates vary in sensible ways across samples, lending greater confidence to our results.